
    An Introduction to Using Software Tools for Automatic Differentiation

    We give a gentle introduction to using various software tools for automatic differentiation (AD). Ready-to-use examples are discussed, and links to further information are presented. Our target audience includes all those who are looking for a straightforward way to get started using the available AD technology. The document is dynamic in the sense that its content will be updated as the AD software evolves.
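    A minimal, self-contained sketch (not taken from the document above) of the operator-overloading idea that many AD tools build on: a hypothetical dual-number type carries a value together with its tangent, and each overloaded operation propagates both. The names Dual and f are illustrative only.

    #include <cmath>
    #include <iostream>

    // Dual number: primal value plus its derivative (tangent) with respect
    // to a chosen input.
    struct Dual {
        double v;  // value
        double d;  // derivative
    };

    Dual operator+(Dual a, Dual b) { return {a.v + b.v, a.d + b.d}; }
    Dual operator*(Dual a, Dual b) { return {a.v * b.v, a.d * b.v + a.v * b.d}; }
    Dual sin(Dual a) { return {std::sin(a.v), std::cos(a.v) * a.d}; }

    // Example function f(x) = sin(x) * x + x, written as ordinary code.
    Dual f(Dual x) { return sin(x) * x + x; }

    int main() {
        Dual x{1.5, 1.0};  // seed dx/dx = 1
        Dual y = f(x);
        std::cout << "f(1.5)  = " << y.v << "\n"
                  << "f'(1.5) = " << y.d << "\n";  // cos(1.5)*1.5 + sin(1.5) + 1
    }

    Overloading-based AD tools provide the same mechanism with a complete set of elemental functions and both tangent (forward) and adjoint (reverse) modes; source-transformation tools generate equivalent derivative code instead.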

    Reduction of the Random Access Memory Size in Adjoint Algorithmic Differentiation by Overloading

    Adjoint algorithmic differentiation by operator and function overloading is based on the interpretation of directed acyclic graphs resulting from evaluations of numerical simulation programs. The size of the computer system memory required to store the graph grows in proportion to the number of floating-point operations executed by the underlying program and quickly exceeds the available memory resources. Naive adjoint algorithmic differentiation therefore often becomes infeasible except for relatively simple numerical simulations. Access to the data associated with the graph can be classified as sequential and random; the latter refers to memory access patterns defined by the adjacency relationship between vertices within the graph. Sequentially accessed data can be decomposed into blocks. The blocks can be streamed across the system memory hierarchy, thus extending the amount of available memory, for example, to hard disks. Asynchronous I/O can help to mitigate the increased cost due to accesses to slower memory. Much larger problem instances can thus be solved without resorting to technically challenging user intervention such as checkpointing. Randomly accessed data should not have to be decomposed, as its block-wise streaming is likely to yield a substantial overhead in computational cost due to data accesses across blocks. Consequently, the size of the randomly accessed memory required by an adjoint should be kept minimal in order to eliminate the need for decomposition. We propose a combination of dedicated memory for adjoint LL-values with the exploitation of remainder bandwidth as a possible solution. Test results indicate significant savings in random access memory size while preserving overall computational efficiency.
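    To make the distinction between sequentially and randomly accessed data concrete, the following is a minimal, assumed sketch (not code from the paper) of tape-based adjoint AD by overloading; the names Tape, Var, and interpret_adjoint are illustrative. One tape entry is recorded per floating-point operation, the entries are swept strictly sequentially during the reverse pass, and the adjoint vector is accessed randomly through the recorded argument indices.

    #include <iostream>
    #include <vector>

    // One entry per elemental operation: argument indices and local partials.
    struct TapeEntry {
        int arg0, arg1;  // -1 marks an unused argument slot
        double p0, p1;   // local partial derivatives w.r.t. arg0, arg1
    };

    struct Tape {
        std::vector<TapeEntry> entries;   // sequentially accessed data
        std::vector<double>    adjoints;  // randomly accessed data

        int record(int a0, double p0, int a1, double p1) {
            entries.push_back({a0, a1, p0, p1});
            adjoints.push_back(0.0);
            return static_cast<int>(entries.size()) - 1;
        }
    };

    Tape tape;

    // Active variable: value plus index of its tape entry.
    struct Var {
        double v;
        int idx;
    };

    Var input(double v)         { return {v, tape.record(-1, 0.0, -1, 0.0)}; }
    Var operator+(Var a, Var b) { return {a.v + b.v, tape.record(a.idx, 1.0, b.idx, 1.0)}; }
    Var operator*(Var a, Var b) { return {a.v * b.v, tape.record(a.idx, b.v, b.idx, a.v)}; }

    // Reverse sweep: sequential pass over the tape, random access into adjoints.
    void interpret_adjoint(int output_idx) {
        tape.adjoints[output_idx] = 1.0;  // seed the output adjoint
        for (int i = static_cast<int>(tape.entries.size()) - 1; i >= 0; --i) {
            const TapeEntry& e = tape.entries[i];
            const double a = tape.adjoints[i];
            if (e.arg0 >= 0) tape.adjoints[e.arg0] += e.p0 * a;
            if (e.arg1 >= 0) tape.adjoints[e.arg1] += e.p1 * a;
        }
    }

    int main() {
        Var x = input(3.0), y = input(4.0);
        Var z = x * y + x;
        interpret_adjoint(z.idx);
        std::cout << "dz/dx = " << tape.adjoints[x.idx] << "\n"   // y + 1 = 5
                  << "dz/dy = " << tape.adjoints[y.idx] << "\n"   // x     = 3
                  << "tape size = " << tape.entries.size() << " entries\n";
    }

    In this picture the entries vector is the sequentially accessed data that could be decomposed into blocks and streamed to slower memory, whereas the adjoints vector corresponds to the randomly accessed memory whose size the paper seeks to keep small.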

    Una descripción inédita de Guayaquil (An Unpublished Description of Guayaquil)

    The great description of Guayaquil, the best of all, is the one made in 1774 by the engineer Don Francisco Requena, to whom we also owe a plan of the port, a map of the entire province, and numerous reports on its defensive and sanitary situation, among other matters (a work published by Mª Luisa Laviana in Historiografía y Bibliografía Americanistas, XXVI, 1982, and reissued as a book by the Escuela de Estudios Hispano-Americanos in 1984). Here, however, we present a short description entitled "Relación de Guayaquil y plano de la ciudad", held at the Servicio Histórico Militar in Madrid. Its undeniable value is that it provides a quick and picturesque view of the Guayaquil of the 1770s, while its very brevity makes it easy to publish.

    Parallel Jacobian Accumulation


    Algorithmic Differentiation of Numerical Methods: Second-order Adjoint Solvers for Parameterized Systems of Nonlinear Equations

    Adjoint-mode algorithmic (also known as automatic) differentiation (AD) transforms implementations of multivariate vector functions as computer programs into first-order adjoint code. Its reapplication, or its combination with tangent-mode AD, yields higher-order adjoint code. Second derivatives play an important role in nonlinear programming. For example, second-order (Newton-type) nonlinear optimization methods promise faster convergence in the neighborhood of the minimum by taking second-derivative information into account. The adjoint mode is of particular interest in large-scale gradient-based nonlinear optimization because its computational cost is independent of the number of free variables. Part of the objective function may be given implicitly as the solution of a system of n parameterized nonlinear equations. If the system parameters depend on the free variables of the objective, then second derivatives of the nonlinear system's solution with respect to those parameters are required. The local computational overhead, as well as the additional memory requirement, for computing second-order adjoints of the solution vector with respect to the parameters by AD depends on the number of iterations performed by the nonlinear solver. This dependence can be eliminated by taking a symbolic approach to the differentiation of the nonlinear system.
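    The symbolic approach mentioned at the end of the abstract typically rests on implicit differentiation of the residual. The following LaTeX fragment is a generic sketch under the assumption that F(x, p) = 0 defines the solution x(p) with a nonsingular Jacobian of F with respect to x; it is not a reproduction of the paper's derivation.

    % Differentiate F(x(p), p) = 0 with respect to p:
    \[
      \frac{\partial F}{\partial x}\,\frac{\mathrm{d}x}{\mathrm{d}p}
        + \frac{\partial F}{\partial p} = 0
      \quad\Longrightarrow\quad
      \frac{\mathrm{d}x}{\mathrm{d}p}
        = -\left(\frac{\partial F}{\partial x}\right)^{-1}\frac{\partial F}{\partial p}.
    \]
    % First-order adjoint: for a given adjoint \bar{x} of the solution,
    % solve one transposed linear system and accumulate into \bar{p}:
    \[
      \left(\frac{\partial F}{\partial x}\right)^{\!\top}\!\lambda = \bar{x},
      \qquad
      \bar{p} \leftarrow \bar{p} - \left(\frac{\partial F}{\partial p}\right)^{\!\top}\!\lambda .
    \]

    Second-order adjoints can then be obtained by applying tangent mode to these two identities (tangent-over-adjoint), so that, in line with the abstract, neither the computational overhead nor the memory requirement depends on the number of iterations performed by the nonlinear solver.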